In thermodynamics, entropy is commonly associated with the amount of order, disorder, or chaos in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced to the alteration in some way or another of the ''arrangement'' of the constituent parts of the working body" and that internal work associated with these alterations is quantified energetically by a measure of "entropy" change, according to the following differential expression:〔''Mechanical Theory of Heat'' – Nine Memoirs on the development of the concept of "entropy" by Rudolf Clausius〕

:<math>dS = \frac{\delta Q}{T}</math>

where ''δQ'' is the heat transferred reversibly to the system from the surroundings and ''T'' is the absolute temperature at which the transfer occurs.

In the years to follow, Ludwig Boltzmann translated these "alterations" into a probabilistic view of order and disorder in gas-phase molecular systems. In recent years, chemistry textbooks have shifted away from the terms "order" and "disorder" toward describing entropy as the dispersal of energy, among other approaches. In the 2002 encyclopedia Encarta, for example, ''entropy'' is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium, as well as a measure of the disorder in the system.〔Microsoft Encarta 2006. © 1993–2005 Microsoft Corporation. All rights reserved.〕 In the context of entropy, "''perfect internal disorder''" is synonymous with "equilibrium", but because that definition differs so markedly from the word's everyday meaning, the scientific use of the term has caused a great deal of confusion and misunderstanding.

Locally, the entropy can be lowered by external action. This applies to machines, such as a refrigerator, where the entropy in the cold chamber is reduced, and to living organisms. This local decrease in entropy is, however, only possible at the expense of an entropy increase in the surroundings.
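The refrigerator example can be made quantitative with Clausius' relation ''dS'' = ''δQ''/''T'' applied to reversible, isothermal transfers. The following sketch (not from the article; the temperatures and heat quantity are assumed values chosen for illustration) shows the best case of a reversible refrigerator, where the local entropy decrease in the cold chamber is exactly balanced by the entropy increase of the surroundings:

```python
# Illustrative sketch: entropy bookkeeping for a refrigerator using
# Clausius' relation dS = dQ/T for reversible, isothermal transfers.
# All numerical values are assumptions for the example.
T_cold = 275.0   # K, temperature inside the cold chamber (assumed)
T_hot = 300.0    # K, temperature of the room/surroundings (assumed)
Q_cold = 1000.0  # J, heat extracted from the cold chamber (assumed)

# Best case: a reversible (Carnot) refrigerator, where the total entropy
# change is zero, so the heat rejected to the room satisfies
# Q_hot / T_hot = Q_cold / T_cold.
Q_hot = Q_cold * T_hot / T_cold

dS_chamber = -Q_cold / T_cold    # entropy of the cold chamber decreases
dS_surroundings = Q_hot / T_hot  # entropy of the surroundings increases

total = dS_chamber + dS_surroundings
print(f"chamber: {dS_chamber:+.3f} J/K, "
      f"surroundings: {dS_surroundings:+.3f} J/K, "
      f"total: {total:+.3f} J/K")
```

Any real refrigerator is irreversible, so it rejects more heat than the reversible bound, making the total entropy change strictly positive, which is the sense in which the local decrease is paid for by the surroundings.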
==History==
This "molecular ordering" perspective on entropy traces its origins to molecular-movement interpretations developed by Rudolf Clausius in the 1850s, particularly his 1862 visual conception of molecular disgregation. Similarly, in 1859, after reading a paper on the diffusion of molecules by Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics. In 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and was so inspired by it that he spent much of his long and distinguished life developing the subject further. Later, in efforts to develop a kinetic theory for the behavior of a gas, Boltzmann applied the laws of probability to Maxwell's and Clausius' molecular interpretation of entropy and thus began to interpret entropy in terms of order and disorder. Similarly, in 1882 Hermann von Helmholtz used the word "Unordnung" (disorder) to describe entropy.